[RHOAIENG-17006] Change the scripts so that it runs with uv #1240
Conversation
Important: Review skipped. Auto reviews are disabled on base/target branches other than the default branch. Please check the settings in the CodeRabbit UI.

Walkthrough

The updates transition the dependency management workflow and supporting script from Pipfile.lock and pipenv/micropipenv to uv.lock and the uv tool. The GitHub Actions workflow and a shell script are both refactored to focus on uv-based lock file and requirements management, replacing the previous references and commands.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~8 minutes
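To make the new flow concrete, here is a minimal sketch of the uv-based sequence the walkthrough describes: re-lock at the repository root, then export pinned requirements for one image. The group name, output path, and the use of uv pip compile are illustrative assumptions, not values taken from this PR.

#!/usr/bin/env bash
# Sketch of the uv-based refresh flow; the group name and output path are hypothetical.
set -euo pipefail

# Re-resolve the repository-wide lock file against pyproject.toml
uv lock --python 3.12

# Export a pinned requirements.txt for one (assumed) dependency group
uv pip compile pyproject.toml \
  --group jupyter-pytorch-image \
  -o jupyter/pytorch/ubi9-python-3.12/requirements.txt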
Hi @mtchoum1. Thanks for your PR. I'm waiting for an opendatahub-io member to verify that this patch is reasonable to test. If it is, they should reply with /ok-to-test. Once the patch is verified, the new status will be reflected by the ok-to-test label. I understand the commands that are listed here. Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.
@@ -1,6 +1,6 @@
 ---
 # This GitHub action is meant to update the pipfile.locks
-name: Pipfile.locks Renewal Action
+name: uv.locks Renewal Action
Hmm, I'm not sure whether we should replace the piplock-renewal GHA here rather than just creating a new GHA for the uv lock? 🤔
I created a new one.
@coderabbitai full review

✅ Actions performed: Full review triggered.
Actionable comments posted: 2
🧹 Nitpick comments (3)
.github/workflows/uvlock-renewal.yaml (2)
Lines 2-3: Update stale comment to reflect new purpose

Line 2 still refers to "pipfile.locks", which is misleading after the migration to uv.lock.

-# This GitHub action is meant to update the pipfile.locks
+# This GitHub action updates the repository-wide uv.lock file
Lines 62-65: Prefer the official installation method for uv

pip install uv often fails on systems without Rust tool-chain headers. Using the official one-liner avoids that friction and yields reproducible binaries.

-# Install uv
-- name: Install uv
-  run: pip install uv
+# Install uv (static build, no Rust required)
+- name: Install uv
+  run: curl -Ls https://astral.sh/uv/install | sh

scripts/sync-requirements-txt.sh (1)
Lines 8-9: Robust install check

uv --version exits 0 even on parsing errors; use command -v for a clean test and upgrade pip first to avoid TLS/SSL edge cases.

-uv --version || pip install uv
+command -v uv >/dev/null 2>&1 || { python -m pip install --upgrade pip && pip install uv; }
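For context, a sketch of how that guard might sit near the top of the sync script, assuming plain bash with pip available as a fallback installer (this is not the script's actual content):

#!/usr/bin/env bash
# Sketch only: install-check guard along the lines suggested above.
set -euo pipefail

if ! command -v uv >/dev/null 2>&1; then
  # Upgrade pip first to avoid TLS/SSL edge cases, then install uv as a fallback.
  python -m pip install --upgrade pip
  python -m pip install uv
fi

uv --version   # confirm uv is now available on PATH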
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
.github/workflows/uvlock-renewal.yaml (3 hunks)
scripts/sync-requirements-txt.sh (1 hunks)
🧰 Additional context used
🧠 Learnings (3)
📓 Common learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1325
File: jupyter/pytorch/ubi9-python-3.12/Pipfile:42-42
Timestamp: 2025-07-09T14:22:14.553Z
Learning: jiridanek requested GitHub issue creation for Pipfile.lock verification script implementation during PR #1325 review, specifically to systematize the manual verification process for dependency version consistency across all lock files using jq. Issue #1367 was created with comprehensive problem description covering manual verification challenges, detailed solution with jq-based verification script, enhanced features for CI integration, clear acceptance criteria, implementation areas breakdown, benefits analysis, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:16:05.131Z
Learning: jiridanek requested GitHub issue creation for RStudio py311 Tekton push pipelines during PR #1379 review. Issue #1384 was successfully created covering two RStudio variants (CPU and CUDA) found in manifests/base/params-latest.env, with comprehensive problem description, implementation requirements following the same pattern as other workbench pipelines, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi:6-6
Timestamp: 2025-07-03T16:17:05.475Z
Learning: jiridanek requested GitHub issue creation for CGI script health-check URL configurability and timeout improvement in codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi during PR #1269 review. The request follows the established pattern of systematic code quality improvements with comprehensive issue creation covering problem description, solution details, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T16:04:22.695Z
Learning: jiridanek requested GitHub issue creation for shell script quality improvements in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh during PR #1269 review. Issue #1307 was created with comprehensive problem description covering variable scoping issues, POSIX compliance concerns, multiple solution options, acceptance criteria, implementation guidance with code examples, testing approaches, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:16:05.131Z
Learning: jiridanek requested GitHub issue creation for adding RStudio py311 Tekton push pipelines during PR #1379 review, referencing existing registry entries in manifests/base/params-latest.env but missing corresponding .tekton pipeline files. A comprehensive issue was created with detailed problem description, implementation requirements following the same pattern as other workbench pipelines, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-21T15:06:04.114Z
Learning: jiridanek requested GitHub issue creation for multi-platform dependency locking investigation during PR #1396 review. Issue #1423 was successfully created with comprehensive problem description covering ARM64 wheel availability but being ignored due to AMD64-only dependency locking, root cause analysis of platform-specific pipenv limitations, immediate conditional installation solution, multi-platform locking ecosystem analysis, broader affected areas investigation, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi:6-6
Timestamp: 2025-07-03T16:17:05.475Z
Learning: jiridanek requested GitHub issue creation for CGI script health-check URL configurability and timeout improvement in codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi during PR #1269 review. Issue #1312 was successfully created with comprehensive problem description covering hard-coded URL limitations, timeout protection, error handling, acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-08T19:35:49.482Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. After an initial failed attempt, issue #1349 was successfully created with comprehensive problem description covering maintenance overhead and consistency risks from duplicate implementations across 5 Python 3.12 runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-28T13:57:48.243Z
Learning: Pipenv automatically detects non-interactive terminal environments (like GitHub Actions CI) and disables spinner animations without requiring PIPENV_NOSPIN=1. It checks for TTY connection and common CI environment variables to determine this behavior.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm=~=2.18.1. The solution requires updating Pipfile sources to include https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-24T12:01:45.188Z
Learning: jiridanek requested verification of 2024.2 tag package versions against corresponding Pipfiles during PR #1496 review. The analysis revealed that 2024.2 tags contain intentionally frozen older versions (N-1 behavior) while the current PR correctly updates only the N (latest) tags to match Python 3.11 Pipfiles, demonstrating proper understanding of the repository's N-1 version freezing policy.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-26T16:19:31.295Z
Learning: In the opendatahub-io/notebooks repository, the Playwright Docker image version in `.github/workflows/build-notebooks-TEMPLATE.yaml` (format: `mcr.microsoft.com/playwright:v1.53.1-noble`) must always match the `@playwright/test` version specified in the `tests/browser/package.json` file. Both versions need to be updated together to maintain consistency between CI/CD pipeline and project dependencies.
.github/workflows/uvlock-renewal.yaml (6)
Learnt from: jiridanek
PR: #1325
File: jupyter/pytorch/ubi9-python-3.12/Pipfile:42-42
Timestamp: 2025-07-09T14:22:14.553Z
Learning: jiridanek requested GitHub issue creation for Pipfile.lock verification script implementation during PR #1325 review, specifically to systematize the manual verification process for dependency version consistency across all lock files using jq. Issue #1367 was created with comprehensive problem description covering manual verification challenges, detailed solution with jq-based verification script, enhanced features for CI integration, clear acceptance criteria, implementation areas breakdown, benefits analysis, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: #1365
File: .github/workflows/insta-merge.yaml:1-1
Timestamp: 2025-07-09T14:41:52.874Z
Learning: jiridanek requested GitHub issue creation for YAML linting problems caused by the new GitHub Actions workflow in PR #1365, specifically addressing failures in the repository's YAML linter due to the insta-merge.yaml workflow file. Issue was created with comprehensive problem description covering potential indentation, quoting, line length, and structural issues, detailed root cause analysis, four solution options (fixing current file, implementing linting rules, automated formatting, repository-wide standardization), clear acceptance criteria, implementation guidance with testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: #1365
File: .github/workflows/insta-merge.yaml:1-1
Timestamp: 2025-07-09T14:41:52.874Z
Learning: jiridanek requested GitHub issue creation for YAML linting problems caused by the new GitHub Actions workflow in PR #1365, specifically addressing failures in the repository's YAML linter due to the insta-merge.yaml workflow file. Issue #1370 was successfully created with comprehensive problem description covering potential indentation, quoting, line length, and structural issues, detailed root cause analysis, four solution options (fixing current file, implementing linting rules, automated formatting, repository-wide standardization), clear acceptance criteria, implementation guidance with testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-09T09:12:44.088Z
Learning: jiridanek requested GitHub issue creation for GitHub Actions artifact naming conflict during PR #1357 review, specifically for a failing actions/upload-artifact@v4 step with 409 Conflict error. Issue was created with comprehensive problem description covering artifact naming conflicts, root cause analysis of duplicate names in concurrent workflows, four solution options (enhanced naming, overwriting, conditional uploads, matrix-aware naming) with code examples, detailed acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic CI/CD and code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: #1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
scripts/sync-requirements-txt.sh (16)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-23T16:18:42.922Z
Learning: The TensorFlow ROCm Python 3.12 compatibility issue in opendatahub-io/notebooks PR #1259 was caused by using tensorflow-rocm==2.14.0.600 in Pipfile.lock which lacks Python 3.12 wheels, while the Pipfile specifies tensorflow_rocm=~=2.18.1. The solution requires updating Pipfile sources to include https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4/ repository which contains tensorflow_rocm-2.18.1-cp312-cp312-manylinux_2_28_x86_64.whl and regenerating Pipfile.lock using the piplock-refresh GitHub Action.
Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-08T19:35:49.482Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. After an initial failed attempt, issue #1349 was successfully created with comprehensive problem description covering maintenance overhead and consistency risks from duplicate implementations across 5 Python 3.12 runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: #1325
File: jupyter/pytorch/ubi9-python-3.12/Pipfile:42-42
Timestamp: 2025-07-09T14:22:14.553Z
Learning: jiridanek requested GitHub issue creation for Pipfile.lock verification script implementation during PR #1325 review, specifically to systematize the manual verification process for dependency version consistency across all lock files using jq. Issue #1367 was created with comprehensive problem description covering manual verification challenges, detailed solution with jq-based verification script, enhanced features for CI integration, clear acceptance criteria, implementation areas breakdown, benefits analysis, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1344 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, extensive affected files analysis including 4 de-vendor scripts, 30+ Dockerfiles with chmod operations, and 12+ pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh:9-12
Timestamp: 2025-07-08T19:28:15.791Z
Learning: jiridanek requested GitHub issue creation for runtime detection improvement of Python site-packages paths during PR #1333 review. Issue #1340 was created with comprehensive problem description covering hard-coded path fragility across UBI9 distributions, affected files analysis including de-vendor scripts and pip.conf configurations, runtime detection solution using sysconfig module, implementation areas breakdown, benefits analysis, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-09T10:24:37.290Z
Learning: jiridanek requested GitHub issue creation for TeX Live installation robustness improvements in jupyter/utils/install_pdf_deps.sh during PR #1357 review, specifically for network resilience and error handling improvements to address flaky CI failures. Issue was created with comprehensive problem description covering network download failures, mirror fallback mechanisms, retry logic, caching strategies, multiple solution options with code examples, detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: #1519
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/runtime-images/llmcompressor-pytorch-ubi9-py311.json:2-9
Timestamp: 2025-07-29T16:00:31.610Z
Learning: jiridanek indicated that the current practice for runtime-images JSON files in opendatahub-io/notebooks has changed significantly from the SHA256 digest pinning pattern, and that rebasing PR #1519 would reveal the new practice which is "something completely different" from the existing pattern.
Learnt from: grdryn
PR: #1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: #1396
File: runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda:124-127
Timestamp: 2025-07-20T20:47:36.509Z
Learning: jiridanek identified that ARM64 wheels for h5py 3.14.0 are available on PyPI but being ignored due to AMD64-only dependency locking with --platform=linux/amd64. This causes unnecessary hdf5-devel package installation in ARM64 TensorFlow images when the ARM64 wheel h5py-3.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl could be used instead. The Pipfile.lock only contains 2 hashes for h5py, confirming limited platform consideration during lock generation.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with a missing runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml file that causes rocm runtime tests to fail with a "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with extras = ["and-cuda"] can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-03T14:04:05.280Z
Learning: In opendatahub-io/notebooks, shell scripting robustness and efficiency issues in extension management loops (such as unquoted globs, word-splitting, and unnecessary repeated copies in run-code-server.sh) are systematically tracked and fixed by quoting variables, using `cp -a --`, and toggling `nullglob`, following the established pattern of operational correctness improvements.
🔇 Additional comments (1)
scripts/sync-requirements-txt.sh (1)
Lines 24-25: Compilation may fail if the group is missing

--group jupyter-${image_name}-image assumes every framework-variant group exists in pyproject.toml. Consider a fallback (or validation) step to avoid silent script termination.

Would you confirm that every ${image_name} (including new *-rocm variants) has a matching [project.optional-dependencies] group in pyproject.toml? If not, the compile step will exit 1 and break CI.
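A possible shape for that validation, sketched under the assumption that the group appears as a key in the repository's top-level pyproject.toml and that the check runs before the compile step (the example image name is made up):

#!/usr/bin/env bash
# Hypothetical pre-flight check; group naming, variable values, and file location are assumptions.
set -euo pipefail

image_name="pytorch-rocm"                 # example value; the real script derives this from the path
group="jupyter-${image_name}-image"

if ! grep -Eq "^[[:space:]]*\"?${group}\"?[[:space:]]*=" pyproject.toml; then
  echo "ERROR: dependency group '${group}' not found in pyproject.toml" >&2
  exit 1                                  # or 'continue' if placed inside the script's per-file loop
fi
echo "group '${group}' present; safe to run the compile step"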
# Run makefile recipe to refresh uv.lock and push changes back to the branch
- name: Run make refresh-pipfilelock-files and push the changes back to the branch
  run: |
    make refresh-pipfilelock-files PYTHON_VERSION=${{ env.PYTHON_VERSION }} INCLUDE_OPT_DIRS=${{ env.INCLUDE_OPT_DIRS }}
    git add .
    git commit -m "Update Pipfile.lock files by piplock-renewal.yaml action"
    uv lock --python ${{ env.PYTHON_VERSION }}
    git add uv.lock
    git commit -m "Update uv.lock files by uvlock-renewal.yaml action"
    git push origin ${{ env.BRANCH }}
🛠️ Refactor suggestion
Step name, env var & empty-diff guard need cleanup
- Step title still mentions pipfilelock – rename for clarity.
- INCLUDE_OPT_DIRS is now dead code – either wire it into uv lock (--features optional) or drop the env var.
- git commit fails when no changes exist. Add a no-change guard.
- # Run makefile recipe to refresh uv.lock and push changes back to the branch
- - name: Run make refresh-pipfilelock-files and push the changes back to the branch
+ # Refresh uv.lock and push changes back to the branch
+ - name: Refresh uv.lock and push (if changed)
run: |
uv lock --python ${{ env.PYTHON_VERSION }}
- git add uv.lock
- git commit -m "Update uv.lock files by uvlock-renewal.yaml action"
+ git add uv.lock
+ if ! git diff --cached --quiet; then
+ git commit -m "Update uv.lock via uvlock-renewal.yaml workflow"
+ git push origin ${{ env.BRANCH }}
+ else
+ echo "uv.lock already up-to-date – nothing to commit"
+ fi
- git push origin ${{ env.BRANCH }}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
Original:

# Run makefile recipe to refresh uv.lock and push changes back to the branch
- name: Run make refresh-pipfilelock-files and push the changes back to the branch
  run: |
    make refresh-pipfilelock-files PYTHON_VERSION=${{ env.PYTHON_VERSION }} INCLUDE_OPT_DIRS=${{ env.INCLUDE_OPT_DIRS }}
    git add .
    git commit -m "Update Pipfile.lock files by piplock-renewal.yaml action"
    uv lock --python ${{ env.PYTHON_VERSION }}
    git add uv.lock
    git commit -m "Update uv.lock files by uvlock-renewal.yaml action"
    git push origin ${{ env.BRANCH }}

Suggested:

# Refresh uv.lock and push changes back to the branch
- name: Refresh uv.lock and push (if changed)
  run: |
    uv lock --python ${{ env.PYTHON_VERSION }}
    git add uv.lock
    if ! git diff --cached --quiet; then
      git commit -m "Update uv.lock via uvlock-renewal.yaml workflow"
      git push origin ${{ env.BRANCH }}
    else
      echo "uv.lock already up-to-date – nothing to commit"
    fi
🤖 Prompt for AI Agents
In .github/workflows/uvlock-renewal.yaml around lines 66 to 72, rename the step
title to accurately reflect that it refreshes uv.lock files instead of
pipfilelock. Remove the unused INCLUDE_OPT_DIRS environment variable or
integrate it properly by adding the --features optional flag to the uv lock
command if needed. Add a guard before the git commit command to check if there
are any changes staged to avoid commit failures when no changes exist.
path="${file#./*}" | ||
image_name="${path%/*/*}" | ||
python_version="${path%/*}" | ||
python_version="${python_version##*-}" | ||
|
||
if [[ "$path" == *"rocm/"* ]]; then | ||
image_name="${image_name#*/}-rocm" | ||
fi | ||
|
Incorrect image name for nested "rocm/…" paths

The current string-mangling turns pytorch/rocm/... into rocm-rocm. A safer approach extracts the first directory (framework) and appends -rocm only once.
- image_name="${path%/*/*}"
- ...
- if [[ "$path" == *"rocm/"* ]]; then
- image_name="${image_name#*/}-rocm"
- fi
+ # derive image name (first path component) and decorate for ROCm if needed
+ image_name="${path%%/*}" # e.g. pytorch, tensorflow, rocm
+ if [[ "$image_name" == "rocm" ]]; then
+ # path like rocm/tensorflow/..., so pick second component and tag with -rocm
+ image_name="$(echo "$path" | cut -d'/' -f2)-rocm"
+ fi
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In scripts/sync-requirements-txt.sh around lines 15 to 23, the current code
incorrectly constructs the image_name for nested "rocm/..." paths, resulting in
names like "rocm-rocm". To fix this, extract only the first directory from the
path as the base image_name and append "-rocm" once if the path contains
"rocm/". This ensures the image_name correctly reflects the top-level framework
with a single "-rocm" suffix.
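To illustrate the behaviour being flagged, here is a small standalone comparison of the original parsing and the suggested one, run over two assumed directory layouts (the input paths are made up for the example; whether both layouts occur in the repository isn't shown in this excerpt):

#!/usr/bin/env bash
# Illustration only: the two input paths below are assumptions, not taken from the repository.
set -euo pipefail

for path in "pytorch/rocm/ubi9-python-3.12/requirements.txt" \
            "rocm/tensorflow/ubi9-python-3.12/requirements.txt"; do
  # Original parsing from the script under review
  orig="${path%/*/*}"
  if [[ "$path" == *"rocm/"* ]]; then
    orig="${orig#*/}-rocm"               # yields "rocm-rocm" for the first layout
  fi

  # Parsing proposed in the suggestion above
  fixed="${path%%/*}"
  if [[ "$fixed" == "rocm" ]]; then
    fixed="$(cut -d'/' -f2 <<<"$path")-rocm"
  fi

  printf '%s\n  original:  %s\n  suggested: %s\n' "$path" "$orig" "$fixed"
done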
/lgtm Thank you so much for the excellent work. Merging this into the feature branch.
[APPROVALNOTIFIER] This PR is APPROVED. This pull-request has been approved by: harshad16. The full list of commands accepted by this bot can be found here. The pull request process is described here.
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing /approve in a comment.
Description
Change piplock-renewal.yaml to uvlock-renewal.yaml so that it refreshes the uv.lock file, and update sync-requirements-txt.sh to regenerate the requirements.txt files.
How Has This Been Tested?
After creating my updated pyproject.toml file, I ran the uvlock-renewal.yaml script, followed by the sync-requirements-txt.sh script.
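For reference, a rough local approximation of those steps might look like the following; the Python version is an assumption, and the workflow itself performs the lock refresh via the same uv lock command:

# Assumed local reproduction of the testing steps described above
uv lock --python 3.12                    # refresh the repository-wide uv.lock
bash scripts/sync-requirements-txt.sh    # regenerate the per-image requirements.txt files
git status --short                       # review which lock/requirements files changed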
Merge criteria: